Incentives and Truthful Reporting in Consensus-centric Crowdsourcing

Authors

  • Ece Kamar
  • Eric Horvitz
Abstract

We address the challenge in crowdsourcing systems of incentivizing people to contribute to the best of their abilities. We focus on the class of crowdsourcing tasks where contributions are provided in pursuit of a single correct answer. This class includes citizen science efforts that seek input from people on identifying events and states in the world. We introduce a new payment rule, called the consensus prediction rule, which uses the consensus of other workers to evaluate a worker's report. We compare this rule to a second payment rule, an adaptation of the peer prediction rule introduced by Miller, Resnick, and Zeckhauser to the domain of crowdsourcing. We show that while both rules promote truthful reporting, the consensus prediction rule has better fairness properties. We present analytical and empirical studies of the behavior of these rules in a noisy, real-world scenario where common knowledge assumptions do not necessarily hold.
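The core idea of scoring a worker against the consensus of peers can be illustrated with a deliberately simplified sketch. Everything here is an assumption for illustration, not the paper's actual rule: binary answers, a plain majority vote standing in for the consensus model, and an assumed common `accuracy` parameter for how often a report matches the consensus.

```python
from collections import Counter
import math

def consensus_payment(worker_report, peer_reports, accuracy=0.8):
    """Hypothetical sketch of a consensus-style payment: the worker's
    report induces a prediction about the peer consensus, which is then
    scored with a logarithmic proper scoring rule.

    `accuracy` is an assumed (illustrative) common belief that a report
    agrees with the peer consensus with this probability.
    """
    # Consensus of the other workers, here simplified to a majority vote.
    consensus, _ = Counter(peer_reports).most_common(1)[0]
    # The report predicts that the consensus equals it w.p. `accuracy`.
    p_assigned = accuracy if worker_report == consensus else 1.0 - accuracy
    # Log scoring rule: strictly proper, so (under the assumed model)
    # truthful reporting maximizes expected payment.
    return math.log(p_assigned)

# A worker who agrees with the peer majority earns a higher payment.
agree = consensus_payment(1, [1, 1, 0, 1])
disagree = consensus_payment(0, [1, 1, 0, 1])
print(agree > disagree)  # True
```

Because the log score is strictly proper, a worker who believes the peers are more likely right than wrong maximizes expected payment by reporting honestly; the paper's actual rules are richer than this sketch and handle the noisy settings its abstract describes.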


Similar articles

Incentives for truthful reporting in crowdsourcing

A challenge with the programmatic access of human talent via crowdsourcing platforms is the specification of incentives and the checking of the quality of contributions. Methodologies for checking quality include providing a payment if the work is approved by the task owner and hiring additional workers to evaluate contributors’ work. Both of these approaches place a burden on people and on the...


Peer Truth Serum: Incentives for Crowdsourcing Measurements and Opinions

Modern decision making tools are based on statistical analysis of abundant data, which is often collected by querying multiple individuals. We consider data collection through crowdsourcing, where independent and self-interested agents, non-experts, report measurements, such as sensor readings, opinions, such as product reviews, or answers to human intelligence tasks. Since the accuracy of info...


Learning the Prior in Minimal Peer Prediction

Many crowdsourcing applications rely on the truthful elicitation of information from workers; e.g., voting on the quality of an image label, or whether a website is inappropriate for an advertiser. Peer prediction provides a theoretical mechanism for eliciting truthful reports. However, its application depends on knowledge of a full probabilistic model: both a distribution on votes, and a poste...


Incentives for Truthful Evaluations

We consider crowdsourcing problems where the users are asked to provide evaluations for items; the user evaluations are then used directly, or aggregated into a consensus value. Lacking an incentive scheme, users have no motive in making effort in completing the evaluations, providing inaccurate answers instead. We propose incentive schemes that are truthful and cheap: truthful as the optimal u...


Incentives for Subjective Evaluations with Private Beliefs

The modern web critically depends on aggregation of information from self-interested agents, for example opinion polls, product ratings, or crowdsourcing. We consider a setting where multiple objects (questions, products, tasks) are evaluated by a group of agents. We first construct a minimal peer prediction mechanism that elicits honest evaluations from a homogeneous population of agents with ...



Journal title:

Volume   Issue

Pages  -

Publication year: 2012